
    Dynamic p-enrichment schemes for multicomponent reactive flows

    We present a family of p-enrichment schemes. These schemes may be separated into two basic classes: the first, called \emph{fixed tolerance schemes}, relies on setting global scalar tolerances on the local regularity of the solution, and the second, called \emph{dioristic schemes}, relies on time-evolving bounds on the local variation in the solution. Each class of p-enrichment scheme is further divided into two basic types. The first type (the Type I schemes) enriches along lines of maximal variation, striving to enhance stable solutions in "areas of highest interest." The second type (the Type II schemes) enriches along lines of maximal regularity in order to maximize the stability of the enrichment process. Each of these schemes is tested over a pair of model problems arising in coastal hydrology. The first is a contaminant transport model, which addresses a declinature problem for a contaminant plume with respect to a bay inlet setting. The second is a multicomponent chemically reactive flow model of estuary eutrophication arising in the Gulf of Mexico.
    Comment: 29 pages, 7 figures, 3 tables
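    The enrichment decision described above can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the regularity estimator, the element list, and all names are assumptions. It only shows the branching logic — Type I raises the local order where the regularity indicator falls below a global tolerance (high variation), Type II where it exceeds the tolerance (smooth solution).

```python
# Toy sketch of a fixed tolerance p-enrichment pass.
# `regularity[i]` is an assumed per-element smoothness indicator in [0, 1].

def enrich(p, regularity, tol, p_max, type_one=True):
    """Return new polynomial orders after one enrichment pass.

    Type I  enriches where local variation is highest (regularity < tol);
    Type II enriches where the solution is smoothest  (regularity > tol).
    """
    new_p = list(p)
    for i, r in enumerate(regularity):
        hit = (r < tol) if type_one else (r > tol)
        if hit and new_p[i] < p_max:
            new_p[i] += 1  # raise local order by one, capped at p_max
    return new_p

# Three elements, global tolerance 0.5, maximum order 3.
print(enrich([1, 1, 1], [0.2, 0.9, 0.4], 0.5, 3))                  # Type I
print(enrich([1, 1, 1], [0.2, 0.9, 0.4], 0.5, 3, type_one=False))  # Type II
```

    A real scheme would recompute the indicator each time step and also de-enrich; this fragment only illustrates the Type I / Type II split.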

    Monitoring the CMS strip tracker readout system

    The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m2 and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks in order to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and possibly to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and describe its software components, which are already in place, the various monitoring streams available, and our experiences of operating and monitoring a large-scale system.
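    The zero-suppression step mentioned above — dropping channels whose digitised value is compatible with noise and keeping only hit channels with their addresses — can be shown with a minimal sketch. The threshold scheme here is a deliberate simplification (the real boards also subtract pedestals and common-mode noise); names and values are illustrative.

```python
# Simplified zero-suppression: keep (channel, value) pairs above threshold.

def zero_suppress(adc, threshold):
    """Return the sparse list of channels whose ADC value exceeds threshold."""
    return [(ch, v) for ch, v in enumerate(adc) if v > threshold]

raw = [0, 0, 42, 3, 17, 0]          # raw ADC values for six channels
print(zero_suppress(raw, 5))        # only channels 2 and 4 survive
```

    The payoff is the same as in the real system: the output size scales with occupancy rather than with the 10 million channels read out.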

    Data acquisition software for the CMS strip tracker

    The CMS silicon strip tracker, providing a sensitive area of approximately 200 m2 and comprising 10 million readout channels, has recently been completed at the tracker integration facility at CERN. The strip tracker community is currently working to develop and integrate the online and offline software frameworks, known as XDAQ and CMSSW respectively, for the purposes of data acquisition and detector commissioning and monitoring. Recent developments have seen the integration of many new services and tools within the online data acquisition system, such as event building, online distributed analysis, an online monitoring framework, and data storage management. We review the various software components that comprise the strip tracker data acquisition system, and the software architectures used for stand-alone and global data-taking modes. Our experiences in commissioning and operating one of the largest ever silicon micro-strip tracking systems are also reviewed.

    Performance of Glass Resistive Plate Chambers for a high granularity semi-digital calorimeter

    A new design of highly granular hadronic calorimeter using Glass Resistive Plate Chambers (GRPCs) with embedded electronics has been proposed for the future International Linear Collider (ILC) experiments. It features a 2-bit threshold semi-digital read-out. Several GRPC prototypes with their electronics have been successfully built and tested in pion beams. The design of these detectors is presented along with the test results on efficiency, pad multiplicity, stability and reproducibility.
    Comment: 16 pages, 15 figures

    Using XDAQ in Application Scenarios of the CMS Experiment

    XDAQ is a generic data acquisition software environment that emerged from a rich set of use cases encountered in the CMS experiment. These use cases cover the deployment for multiple sub-detectors, the operation of different processing and networking equipment, and a distributed collaboration of users with different needs. The use of the software in various application scenarios has demonstrated the viability of the approach. We discuss two applications: the tracker local DAQ system for front-end commissioning and the muon chamber validation system. The description is completed by a brief overview of XDAQ.
    Comment: Conference CHEP 2003 (Computing in High Energy and Nuclear Physics, La Jolla, CA)

    Construction and commissioning of a technological prototype of a high-granularity semi-digital hadronic calorimeter

    A large prototype of 1.3 m3 was designed and built as a demonstrator of the semi-digital hadronic calorimeter (SDHCAL) concept proposed for the future ILC experiments. The prototype is a sampling hadronic calorimeter of 48 units. Each unit is built of an active layer made of a 1 m2 Glass Resistive Plate Chamber (GRPC) detector placed inside a cassette whose walls are made of stainless steel. The cassette also contains the electronics used to read out the GRPC detector. The lateral granularity of the active layer is provided by the electronics pick-up pads of 1 cm2 each. The cassettes are inserted into a self-supporting mechanical structure, also built of stainless steel plates, which, together with the cassette walls, plays the role of the absorber. The prototype was designed to be very compact, and important efforts were made to minimize the number of service cables in order to optimize the efficiency of the Particle Flow Algorithm techniques to be used in the future ILC experiments. The different components of the SDHCAL prototype were studied individually and strict criteria were applied for the final selection of these components. Basic calibration procedures were performed after the prototype assembly. The prototype is the first of a series of new-generation detectors equipped with a power-pulsing mode intended to reduce the power consumption of this highly granular detector. A dedicated acquisition system was developed to deal with the output of more than 440000 electronics channels in both trigger and triggerless modes. After its completion in 2011, the prototype was commissioned using cosmic rays and particle beams at CERN.
    Comment: 49 pages, 41 figures

    A future for intelligent autonomous ocean observing systems

    Ocean scientists have dreamed of and recently started to realize an ocean observing revolution with autonomous observing platforms and sensors. Critical questions to be answered by such autonomous systems are where, when, and what to sample for optimal information, and how to optimally reach the sampling locations. Definitions, concepts, and progress towards answering these questions using quantitative predictions and fundamental principles are presented. Results in reachability and path planning, adaptive sampling, machine learning, and teaming machines with scientists are overviewed. The integrated use of differential equations and theory from varied disciplines is emphasized. The results provide an inference engine and knowledge base for expert autonomous observing systems. They are showcased using a set of recent at-sea campaigns and realistic simulations. Real-time experiments with identical autonomous underwater vehicles (AUVs) in the Buzzards Bay and Vineyard Sound region first show that our predicted time-optimal paths were faster than shortest-distance paths. Deterministic and probabilistic reachability and path forecasts issued and validated for gliders and floats in the northern Arabian Sea are then presented. Novel Bayesian adaptive sampling for hypothesis testing and optimal learning are finally shown to forecast the observations most informative to estimate the accuracy of model formulations, the values of ecosystem parameters and dynamic fields, and the presence of Lagrangian Coherent Structures.
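    The finding that time-optimal paths beat shortest-distance paths has a simple intuition: in a current field, a longer leg that rides a favourable flow can arrive sooner than a shorter leg fighting the flow. The numbers and the two candidate legs below are invented for illustration; the authors' planners solve this with PDE-based reachability fronts, not leg-by-leg arithmetic.

```python
# Toy comparison of two legs for a 1 m/s vehicle in a steady current.

def travel_time(length_m, vehicle_speed, along_current):
    """Traverse time for a leg; along_current > 0 means the current helps."""
    ground_speed = vehicle_speed + along_current
    assert ground_speed > 0, "vehicle cannot make headway on this leg"
    return length_m / ground_speed

# Shortest path: 10 km straight into a 0.5 m/s opposing current.
t_short = travel_time(10_000, 1.0, -0.5)
# Detour: 14 km riding a 0.5 m/s favourable current.
t_detour = travel_time(14_000, 1.0, +0.5)
print(f"shortest: {t_short/3600:.2f} h, detour: {t_detour/3600:.2f} h")
```

    Here the 40% longer detour arrives in less than half the time, which is the effect the at-sea experiments quantified with real flow forecasts.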

    Inner structure of the Puy de Dôme volcano: cross-comparison of geophysical models (ERT, gravimetry, muon imaging)

    Muon imaging of volcanoes and of geological structures in general is actively being developed by several groups in the world. It has the potential to provide 3-D density distributions with an accuracy of a few percent. At this stage of development, comparisons with established geophysical methods are useful to validate the method. An experiment has been carried out in 2011 and 2012 on a large trachytic dome, the Puy de Dôme volcano, to perform such a comparison of muon imaging with gravimetric tomography and 2-D electrical resistivity tomography. Here, we present the preliminary results for the last two methods. North-south and east-west resistivity profiles allow us to model the resistivity distribution down to the base of the dome. The modelling of the Bouguer anomaly provides models for the density distribution within the dome that are directly comparable with the results from the muon imaging. Our ultimate goal is to derive a model of the dome using the joint interpretation of all sets of data.
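    The Bouguer-anomaly modelling mentioned above rests on the standard slab approximation: an infinite horizontal slab of thickness h and density contrast rho perturbs gravity by 2*pi*G*rho*h. A one-function sketch gives a sense of the signal sizes involved; the specific contrast and thickness below are illustrative, not values from the Puy de Dôme study.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2 (CODATA)

def bouguer_slab(rho, h):
    """Gravity anomaly of an infinite slab: 2*pi*G*rho*h, in m/s^2.

    rho: density contrast in kg/m^3, h: slab thickness in m.
    """
    return 2 * math.pi * G * rho * h

# Example: a 100 m thick body with a +300 kg/m^3 contrast.
anomaly_mgal = bouguer_slab(300, 100) * 1e5  # 1 mGal = 1e-5 m/s^2
print(f"{anomaly_mgal:.2f} mGal")
```

    Anomalies at the milligal level are readily measurable with modern gravimeters, which is why gravimetric tomography can constrain internal density contrasts of a dome at the few-percent level claimed for muon imaging.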